Bagged Voting Ensembles

Authors

  • Sotiris B. Kotsiantis
  • Panayiotis E. Pintelas
Abstract

Bayesian and decision tree classifiers are among the most popular classifiers in the data mining community, and numerous researchers have recently examined their suitability for ensembles. Although many methods of ensemble creation have been proposed, there is as yet no clear picture of which method is best. In this work, we propose Bagged Voting, which uses different subsets of the same training dataset together with a voting methodology that combines a Bayesian and a decision tree algorithm. In our algorithm, voters express the degree of their preference using the probabilities of the classifiers' predictions as confidence scores. All confidence values are then summed for each candidate class, and the candidate with the highest sum wins the election. We compared the presented ensemble with other ensembles that use either the Bayesian or the decision tree classifier as base learner and obtained better accuracy in most cases.
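The confidence-summing vote described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the bootstrap helper and the example probability values are hypothetical, and the base learners (a Bayesian and a decision tree classifier in the paper) are abstracted away as per-class probability distributions.

```python
import random
from collections import defaultdict

def bootstrap_sample(data, rng):
    """Draw one bootstrap replicate (sampling with replacement),
    as bagging does to build each voter's training subset."""
    return [rng.choice(data) for _ in range(len(data))]

def sum_rule_vote(prob_dists):
    """Combine the voters' class-probability estimates: add the
    confidence values per candidate class and return the class
    with the highest sum."""
    totals = defaultdict(float)
    for dist in prob_dists:
        for label, p in dist.items():
            totals[label] += p
    return max(totals, key=totals.get)

# Hypothetical example: three voters' predicted class probabilities
# for one test instance.
votes = [
    {"yes": 0.6, "no": 0.4},
    {"yes": 0.3, "no": 0.7},
    {"yes": 0.9, "no": 0.1},
]
print(sum_rule_vote(votes))  # "yes" wins: 1.8 vs 1.2
```

Summing probabilities rather than counting hard votes lets a voter with a weak preference (e.g. 0.55 vs 0.45) contribute less to the outcome than a confident one, which is the "degree of preference" idea in the abstract.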


Similar articles

A comparison of stacking with meta decision trees to other combining methods

Meta decision trees (MDTs) are a method for combining multiple classifiers. We present an integration of the algorithm MLC4.5 for learning MDTs into the Weka data mining suite. We compare classifier ensembles combined with MDTs to bagged and boosted decision trees, and to classifier ensembles combined with other methods: voting, grading, multi-scheme and stacking with multi-response linear regr...


Tuning diversity in bagged neural network ensembles

In this paper we address the issue of how to optimize the generalization performance of bagged neural network ensembles. We investigate how diversity amongst networks in bagged ensembles can significantly influence ensemble generalization performance and propose a new early-stopping technique that effectively tunes this diversity so that overall ensemble generalization performance is optimized. Ex...


Cancer recognition with bagged ensembles of support vector machines

Expression-based classification of tumors requires stable, reliable and variance reduction methods, as DNA microarray data are characterized by low size, high dimensionality, noise and large biological variability. In order to address the variance and curse of dimensionality problems arising from this difficult task, we propose to apply bagged ensembles of Support Vector Machines (SVM) and feat...


The NeuralBAG algorithm: optimizing generalization performance in bagged neural networks

In this paper we propose an algorithm we call NeuralBAG that estimates the set of weights and number of hidden units each network in a bagged ensemble should have so that the generalization performance of the ensemble is optimized. Experiments performed on noisy synthetic data demonstrate the potential of the algorithm. On average, ensembles trained using NeuralBAG outperform bagged networks...


Comparing ensembles of decision trees and neural networks for one-day-ahead streamflow prediction

Ensemble learning methods have received remarkable attention in recent years and have led to considerable advances in the performance of regression and classification models. Bagging and boosting are among the most popular ensemble learning techniques proposed to reduce the prediction error of learning machines. In this study, bagging and gradient boosting algorithms are incorporated in...




Publication date: 2004